14 research outputs found

    Spatiotemporal Calibration of Electron Microscopes


    Magnification-continuous static calibration model of a scanning-electron microscope.

    We present a new calibration model of both static distortion and projection for a scanning-electron microscope (SEM). The proposed calibration model depends continuously on the magnification factor. State-of-the-art methods have proposed models that solve the static distortion and projection problem, but only for a discrete set of low and high magnifications: at low magnifications, existing models assume static distortion and a perspective projection; at high magnifications, they assume an orthogonal projection without static distortion. A magnification-continuous model that defines a continuous switch from low to high magnifications had not yet been proposed. We propose such a magnification-continuous static calibration model of the SEM, in which the static distortion and the intrinsics of the projection matrix are modeled by partial differential equations (PDEs) with respect to magnification. The approach is applied successfully to the JEOL-JSM 820 in secondary electron imaging mode for magnifications ranging from 100× to 10k×. The final RMS reprojection error is about 0.9 pixels. This result, together with two application-based experiments (consistent measurements of the bending of a cantilever and a 3-D reconstruction of a nano-ball), emphasizes the relevance of the proposed approach.
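
    The sketch below illustrates the general idea of a magnification-continuous projection model: intrinsics that vary with the magnification factor and a projection that blends continuously from a perspective regime at low magnification to a near-orthographic one at high magnification. The functional forms used here (linear focal-length scaling, a logistic blend, a fixed reference depth) are illustrative assumptions, not the PDE-based model identified in the paper.

        import numpy as np

        def intrinsics(m, f0=2.0e3, cx=512.0, cy=512.0):
            """Hypothetical magnification-dependent pinhole intrinsics K(m)."""
            f = f0 * m                                     # focal length grows with magnification
            return np.array([[f, 0.0, cx],
                             [0.0, f, cy],
                             [0.0, 0.0, 1.0]])

        def project(points, m, z_ref=1.0e4):
            """Project 3-D points (N x 3) at magnification m.

            A logistic weight w(m) moves continuously from ~0 (perspective regime,
            low magnification) to ~1 (orthographic regime, high magnification).
            """
            K = intrinsics(m)
            w = 1.0 / (1.0 + np.exp(-4.0 * (np.log10(m) - 3.0)))
            depth = (1.0 - w) * points[:, 2] + w * z_ref   # orthographic limit: depth frozen
            rays = np.c_[points[:, 0] / depth, points[:, 1] / depth, np.ones(len(points))]
            uv = (K @ rays.T).T
            return uv[:, :2]

        pts = np.array([[1.0e-3, 2.0e-3, 1.00e4],
                        [-1.0e-3, 0.5e-3, 1.01e4]])
        print(project(pts, m=100))      # near-perspective regime
        print(project(pts, m=10000))    # near-orthographic regime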

    Toward fast calibration of global drift in scanning electron microscopes with respect to time and magnification.

    It is well known that scanning electron microscope (SEM) image acquisition is mainly affected by nonlinearities and instabilities of the column and of the probe-specimen interaction, which in turn produce a shift of the image points with respect to many parameters, and to time in particular. Even though this drift is smaller in modern SEMs, it remains an important factor to consider in most SEM-based applications. In this article, a simple real-time method is proposed to estimate the global drift from a set of target images using image phase correlation, and to model its evolution with recursive equations in time and magnification. Based on the developed model, a Kalman filter is used in real time for accurate estimation and removal of the drift from the images. The method is tested on images from a tungsten filament gun SEM (Jeol JSM 820) and a field emission gun SEM (FEI Quanta 200). The results show the effectiveness of the developed algorithm and demonstrate its ability to be used in robotics as well as in material characterization under SEM.
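
    A minimal sketch of the two ingredients named in the abstract, under assumed settings: the global shift between two frames is estimated by FFT phase correlation, and the drift is then smoothed with a generic constant-velocity Kalman filter. The paper's recursive drift model in time and magnification is not reproduced here; the state layout and noise values below are illustrative assumptions.

        import numpy as np

        def phase_correlation_shift(ref, cur):
            """Estimate the (dy, dx) translation of `cur` relative to `ref`."""
            F1, F2 = np.fft.fft2(ref), np.fft.fft2(cur)
            cross = np.conj(F1) * F2
            cross /= np.abs(cross) + 1e-12                 # normalized cross-power spectrum
            corr = np.abs(np.fft.ifft2(cross))
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            if dy > ref.shape[0] // 2:                     # wrap large shifts to negative values
                dy -= ref.shape[0]
            if dx > ref.shape[1] // 2:
                dx -= ref.shape[1]
            return float(dy), float(dx)

        class DriftKalman:
            """Constant-velocity Kalman filter on the 2-D drift (position and velocity)."""
            def __init__(self, dt=1.0, q=1e-3, r=0.25):
                self.x = np.zeros(4)                       # state: [dy, dx, vy, vx]
                self.P = np.eye(4)
                self.F = np.eye(4)
                self.F[0, 2] = self.F[1, 3] = dt           # position integrates velocity
                self.H = np.zeros((2, 4))
                self.H[0, 0] = self.H[1, 1] = 1.0          # only the drift itself is measured
                self.Q = q * np.eye(4)
                self.R = r * np.eye(2)

            def update(self, measured_shift):
                self.x = self.F @ self.x                   # predict
                self.P = self.F @ self.P @ self.F.T + self.Q
                S = self.H @ self.P @ self.H.T + self.R    # correct with the measured shift
                K = self.P @ self.H.T @ np.linalg.inv(S)
                self.x = self.x + K @ (np.asarray(measured_shift) - self.H @ self.x)
                self.P = (np.eye(4) - K @ self.H) @ self.P
                return self.x[:2]                          # filtered (dy, dx) drift estimate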

    Planification et exécution de mouvements référencés sur des amers

    Robot motion execution is a difficult task for mainly three reasons. The first one is the high complexity of the path planning problem. The second one is the inaccuracy of the map of the environment used to plan the trajectory, and the third one is that motion tasks in cluttered environments require precise localization. The first of these three issues has attracted a lot of interest over the past fifteen years, and solutions have been proposed to solve the path planning problem for simple or complex kinematic systems. Our work deals with the last two issues. The generic approach we propose aims at producing motion features composed of a reference trajectory and a set of sensor-landmark pairs. These motion features define closed-loop motion strategies for the robot along the planned trajectory. We develop a generic strategy-planning algorithm within a software platform to select the most relevant landmarks. These strategies have to take into account both obstacles that represent a danger of collision and landmarks that yield a good localization. Experimental results on board the mobile robot Hilare 2 towing a trailer validate our approach both for maneuvers such as parking and for navigation tasks in cluttered environments.
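
    As a purely illustrative data-structure sketch (the names and fields are hypothetical, not the thesis' software), a motion feature can be seen as a trajectory segment paired with the sensor-landmark couple used to servo the robot locally, with landmarks scored by how well they localize the robot and how much of a collision danger they represent.

        from dataclasses import dataclass
        from typing import List, Tuple

        Pose = Tuple[float, float, float]              # (x, y, heading) along the planned path

        @dataclass
        class Landmark:
            name: str
            position: Tuple[float, float]
            localization_quality: float                # higher means better pose correction
            collision_risk: float                      # higher means more dangerous to approach

        @dataclass
        class MotionFeature:
            segment: List[Pose]                        # portion of the reference trajectory
            sensor: str                                # e.g. "laser" or "camera"
            landmark: Landmark                         # landmark servoed against on this segment

        def score(lm: Landmark) -> float:
            """Favour landmarks that localize well or that must be kept at a safe distance."""
            return lm.localization_quality + lm.collision_risk

        def assign_landmarks(segments: List[List[Pose]],
                             visible: List[List[Landmark]],
                             sensor: str = "laser") -> List[MotionFeature]:
            """For each trajectory segment, keep the highest-scoring visible landmark."""
            return [MotionFeature(seg, sensor, max(lms, key=score))
                    for seg, lms in zip(segments, visible) if lms]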

    A General Framework for Planning Landmark-Based Motions for Mobile Robots

    Our work focuses on defining a generic approach for planning landmark-based motion. The proposed geometric approach deals with robots with complex kinematics moving in cluttered environments and aims at providing a safe motion strategy under such constrained conditions. The proposed method automatically selects the most relevant landmarks along a pre-planned geometric path, and it provides a strategy to correct the trajectory and to switch smoothly among the landmarks of the environment. Experimental results highlight the relevance of the proposed formalism.
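
    One possible way to realize a smooth switch between landmarks, sketched under assumptions (the linear blend over a transition window is an illustration, not the paper's formalism): the correction computed from the outgoing landmark is faded into the one computed from the incoming landmark as the robot progresses along the path.

        import numpy as np

        def blended_correction(s, s_switch, window, corr_out, corr_in):
            """Blend two correction vectors around the switching abscissa.

            s         : current curvilinear abscissa along the planned path
            s_switch  : abscissa at which the active landmark changes
            window    : length of the transition zone
            corr_out  : correction (dx, dy) computed from the outgoing landmark
            corr_in   : correction (dx, dy) computed from the incoming landmark
            """
            w = np.clip((s - (s_switch - window / 2.0)) / window, 0.0, 1.0)
            return (1.0 - w) * np.asarray(corr_out) + w * np.asarray(corr_in)

        # Halfway through the transition the two corrections are simply averaged.
        print(blended_correction(5.0, 5.0, 2.0, (0.10, -0.02), (0.04, 0.01)))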

    Deep neural network architecture for automated soft surgical skills evaluation using objective structured assessment of technical skills criteria

    PURPOSE: Classic methods of surgical skills evaluation tend to classify surgeon performance into discrete, multi-categorical classes. Although this classification scheme has proven effective, it does not provide intermediate evaluation levels; if such intermediate scoring levels were available, they would provide a more accurate evaluation of the surgical trainee. METHODS: We propose a novel approach to assess surgical skills on a continuous scale ranging from 1 to 5. We show that the proposed approach is flexible enough to be used either for a global performance score or for several sub-scores based on the Objective Structured Assessment of Technical Skills (OSATS) criteria set. We designed a combined CNN+BiLSTM architecture to take advantage of both the temporal and the spatial features of the kinematic data. Our experimental validation relies on real-world data from the JIGSAWS database, in which surgeons are evaluated on three tasks: Knot-Tying, Needle-Passing and Suturing. The proposed neural network takes as input a sequence of 76 kinematic variables and produces a continuous output score ranging from 1 to 5 that reflects the quality of the performed surgical task. RESULTS: Our model achieves high-quality OSATS score predictions, with mean Spearman correlation coefficients between the predicted and ground-truth scores of 0.82, 0.60 and 0.65 for Knot-Tying, Needle-Passing and Suturing, respectively. To our knowledge, we are the first to achieve this regression performance using the OSATS criteria and the JIGSAWS kinematic data. CONCLUSION: An effective deep learning tool was created for surgical skills assessment, and our method could be a promising evaluation tool for surgical training programs.
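
    A minimal architecture sketch of the kind of CNN+BiLSTM regressor described above, written in PyTorch. Layer sizes, kernel widths, the use of the last time step and the sigmoid scaling to the 1-5 range are assumptions for illustration; they are not the exact configuration reported in the paper.

        import torch
        import torch.nn as nn

        class CnnBiLstmScorer(nn.Module):
            def __init__(self, n_channels=76, conv_dim=64, lstm_dim=128):
                super().__init__()
                self.conv = nn.Sequential(                        # per-time-step spatial features
                    nn.Conv1d(n_channels, conv_dim, kernel_size=5, padding=2),
                    nn.ReLU(),
                    nn.Conv1d(conv_dim, conv_dim, kernel_size=5, padding=2),
                    nn.ReLU(),
                )
                self.lstm = nn.LSTM(conv_dim, lstm_dim,           # temporal features, both directions
                                    batch_first=True, bidirectional=True)
                self.head = nn.Linear(2 * lstm_dim, 1)            # single continuous score

            def forward(self, x):                                 # x: (batch, time, 76)
                h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # -> (batch, time, conv_dim)
                h, _ = self.lstm(h)                               # -> (batch, time, 2 * lstm_dim)
                score01 = torch.sigmoid(self.head(h[:, -1]))      # last time step, squashed to (0, 1)
                return 1.0 + 4.0 * score01.squeeze(-1)            # mapped to the 1-5 OSATS scale

        # Example: a batch of two trials, 300 time steps of 76 kinematic variables each.
        model = CnnBiLstmScorer()
        print(model(torch.randn(2, 300, 76)).shape)               # torch.Size([2])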
